Distributed Mini-Batch SDCA
Authors
Abstract
We present an improved analysis of mini-batched stochastic dual coordinate ascent (SDCA) for regularized empirical loss minimization (i.e. SVM and SVM-type objectives). Our analysis allows for flexible sampling schemes, including schemes where the data are distributed across machines, and yields bounds that depend on the smoothness of the loss and/or the spread of the data (measured through the spectral norm).
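Since the abstract stays at a high level, a minimal sketch of what a mini-batch SDCA update looks like may help orient the reader. Everything specific below is an illustrative assumption rather than the paper's method: the hinge loss, the function name minibatch_sdca_svm, and the conservative 1/b aggregation step size (the paper's safe step size is tied to the spectral norm of the data, not this placeholder).

```python
import numpy as np

def minibatch_sdca_svm(X, y, lam=0.01, batch_size=8, epochs=20, beta=None, seed=0):
    """Sketch of mini-batch SDCA for an L2-regularized hinge-loss SVM.

    X: (n, d) data matrix; y: labels in {-1, +1}.
    beta: aggregation step size. The paper ties the safe choice to the
    spectral norm of the data; 1/batch_size here is a conservative
    placeholder, not the paper's tuned value.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if beta is None:
        beta = 1.0 / batch_size          # placeholder safe step size
    alpha = np.zeros(n)                  # dual variables, one per example
    w = np.zeros(d)                      # primal iterate: (1/(lam*n)) * X^T (alpha*y)
    sq_norms = np.einsum('ij,ij->i', X, X)

    for _ in range(epochs * (n // batch_size)):
        batch = rng.choice(n, size=batch_size, replace=False)
        delta_w = np.zeros(d)
        for i in batch:
            # Closed-form SDCA step for the hinge loss, computed against the
            # *shared* w -- this is what makes the batch parallelizable.
            grad = 1.0 - y[i] * (w @ X[i])
            raw = alpha[i] + lam * n * grad / max(sq_norms[i], 1e-12)
            delta = np.clip(raw, 0.0, 1.0) - alpha[i]
            alpha[i] += beta * delta     # stays in [0,1] since beta <= 1
            delta_w += beta * delta * y[i] * X[i]
        w += delta_w / (lam * n)         # keep the primal-dual relation
    return w, alpha
```

With batch_size = 1 this reduces to plain serial SDCA; larger batches trade more parallel (or distributed) work per round against the smaller safe step size that the paper's analysis quantifies.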
Similar Resources
Dual Free Adaptive Mini-batch SDCA for Empirical Risk Minimization
In this paper we develop a dual-free mini-batch SDCA with adaptive probabilities for regularized empirical risk minimization. This work is motivated by the recent work of Shai Shalev-Shwartz on the dual-free SDCA method; however, we allow non-uniform selection of "dual" coordinates in SDCA. Moreover, the probabilities can change over time, making the method more efficient than a fixed uniform or non-uniform selection... (a small sampler sketch follows this list)
Randomized Dual Coordinate Ascent with Arbitrary Sampling
We study the problem of minimizing the average of a large number of smooth convex functions penalized with a strongly convex regularizer. We propose and analyze a novel primal-dual method (Quartz) which at every iteration samples and updates a random subset of the dual variables, chosen according to an arbitrary distribution. In contrast to typical analysis, we directly bound the decrease of th...
Dual Solver for the Multiplicative Kernel Structural SVM
This manuscript describes the implementation of a mini-batch dual solver for the multiplicative kernel structural SVM used in [1]. The solver is written from scratch (except for wrapping around a QP solver), and uses dual coordinate ascent style updates, similar to SMO [2, 3, 4], SDCA [5], and D. Ramanan’s linear structural SVM solver [6]. The use of a mini-batch update strategy resulted in a 1...
Communication-Efficient Distributed Dual Coordinate Ascent
Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In this paper, we propose a communication-efficient framework, COCOA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of al...
Accelerated Mini-Batch Stochastic Dual Coordinate Ascent
Stochastic dual coordinate ascent (SDCA) is an effective technique for solving regularized loss minimization problems in machine learning. This paper considers an extension of SDCA under the mini-batch setting that is often used in practice. Our main contribution is to introduce an accelerated mini-batch version of SDCA and prove a fast convergence rate for this method. We discuss an implementati...
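The first two entries above turn on non-uniform (and possibly time-varying) selection of dual coordinates. As a companion to the mini-batch sketch after the abstract, here is a minimal sampler showing how such probabilities might drive the batch choice. The name nonuniform_batch, the residual-based scores, and the mix-with-uniform floor are all illustrative assumptions, not the papers' exact rules; a full method such as Quartz would also compensate for the sampling bias in its step sizes.

```python
import numpy as np

def nonuniform_batch(scores, batch_size, rng):
    """Sample a mini-batch of dual coordinate indices without replacement,
    biased toward coordinates with large scores (e.g. dual residuals or
    per-example Lipschitz constants). Mixing with the uniform distribution
    keeps every probability strictly positive -- a stability assumption on
    our part, not a rule taken from either paper."""
    p = np.asarray(scores, dtype=float) + 1e-12
    p = 0.5 * p / p.sum() + 0.5 / len(p)   # every coordinate stays reachable
    return rng.choice(len(p), size=batch_size, replace=False, p=p)

# Example: bias sampling toward examples with large hinge residuals.
rng = np.random.default_rng(0)
margins = np.array([1.8, 0.2, -0.5, 0.9, 1.1])
residuals = np.maximum(0.0, 1.0 - margins)
batch = nonuniform_batch(residuals, batch_size=2, rng=rng)
```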
Journal: CoRR
Volume: abs/1507.08322
Publication date: 2015